A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications

Authors

  • Arnaud Marsiglietti
  • Victoria Kostina
Abstract

We derive a lower bound on the differential entropy of a log-concave random variable X in terms of the p-th absolute moment of X. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources and distortion measure |x − x̂|^r, and we establish that the difference between the rate-distortion function and the Shannon lower bound is at most log(2√(πe)) ≈ 2.5 bits, independently of r and the target distortion d. For mean-square error distortion, the difference is at most log(√(2πe)) ≈ 2 bits, regardless of d. The bounds can be further strengthened if the source, in addition to being log-concave, is symmetric. In particular, we establish that for mean-square error distortion, the difference is at most log(√(πe)) ≈ 1.5 bits, regardless of d. We also provide bounds on the capacity of memoryless additive noise channels when the noise is log-concave. We show that the difference between the capacity of such channels and the capacity of the Gaussian channel with the same noise power is at most log(√(2πe)) ≈ 2 bits, and at most log(√(πe)) ≈ 1.5 bits if the noise is symmetric log-concave. Our results generalize to the case of a vector X with possibly dependent coordinates, and to γ-concave random variables. Our proof technique leverages tools from convex geometry.
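The constants quoted in the abstract are all of the form log(c) for simple expressions in π and e. As a quick sanity check (assuming, as the "bits" units indicate, that logarithms are base 2), the approximate values can be verified numerically:

```python
import math

# Verify the approximate bit values quoted in the abstract
# (assumption: log denotes the base-2 logarithm, since gaps are stated in bits).
gap_rth_moment = math.log2(2 * math.sqrt(math.pi * math.e))  # distortion |x - x_hat|^r
gap_mse        = math.log2(math.sqrt(2 * math.pi * math.e))  # mean-square error
gap_mse_symm   = math.log2(math.sqrt(math.pi * math.e))      # MSE, symmetric source

print(round(gap_rth_moment, 3))  # ~2.5 bits
print(round(gap_mse, 3))         # ~2 bits
print(round(gap_mse_symm, 3))    # ~1.5 bits
```

The three values come out to roughly 2.55, 2.05, and 1.55 bits, consistent with the rounded figures 2.5, 2, and 1.5 stated above.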


Similar Articles

A reverse entropy power inequality for log-concave random vectors

We prove that the exponent of the entropy of one dimensional projections of a log-concave random vector defines a 1/5-seminorm. We make two conjectures concerning reverse entropy power inequalities in the log-concave setting and discuss some examples. 2010 Mathematics Subject Classification. Primary 94A17; Secondary 52A40, 60E15.


Entropy Jumps for Radially Symmetric Random Vectors

We establish a quantitative bound on the entropy jump associated to the sum of independent, identically distributed (IID) radially symmetric random vectors having dimension greater than one. Following the usual approach, we first consider the analogous problem of Fisher information dissipation, and then integrate along the Ornstein-Uhlenbeck semigroup to obtain an entropic inequality. In a depa...


On the entropy power inequality for the Rényi entropy of order [0, 1]

Using a sharp version of the reverse Young inequality, and a Rényi entropy comparison result due to Fradelizi, Madiman, and Wang, the authors derive a Rényi entropy power inequality for log-concave random vectors when the Rényi parameters belong to (0, 1). Furthermore, the estimates are shown to be somewhat sharp.


Kahane-Khinchin type Averages

We prove a Kahane-Khinchin type result with a few random vectors, which are distributed independently with respect to an arbitrary log-concave probability measure on Rn. This is an application of a small ball estimate and Chernoff's method, which has recently been used in the context of Asymptotic Geometric Analysis in [1], [2].


Some properties of the parametric relative operator entropy

The notion of entropy was introduced by Clausius in 1850, and some of the main steps towards the consolidation of the concept were taken by Boltzmann and Gibbs. Since then several extensions and reformulations have been developed in various disciplines with motivations and applications in different subjects, such as statistical mechanics, information theory, and dynamical systems. Fujii and Kam...



Journal:
  • Entropy

Volume 20, Issue

Pages  -

Publication year: 2018